Disinformation is misleading content deliberately spread to deceive people or to secure economic or political gain, and which may cause public harm. It is an orchestrated adversarial activity in which actors employ strategic deceptions and media manipulation tactics to advance political, military, or commercial goals. Disinformation is implemented through coordinated campaigns that "weaponize multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value judgment—to exploit and amplify culture wars and other identity-driven controversies."
In contrast, misinformation refers to inaccuracies that stem from inadvertent error. Misinformation can become disinformation when known misinformation is purposefully and intentionally disseminated. "Fake news" has sometimes been categorized as a type of disinformation, but scholars have advised against using the two terms interchangeably, and against using "fake news" at all in academic writing, since politicians have weaponized it to describe any unfavorable news coverage or information.
Some consider it a loan translation of the Russian дезинформация, transliterated as dezinformatsiya, apparently derived from the title of a KGB black propaganda department. Soviet planners in the 1950s defined disinformation as "dissemination (in the press, on the radio, etc.) of false reports intended to mislead public opinion."
Disinformation first appeared in dictionaries in 1985, specifically Webster's New College Dictionary and the American Heritage Dictionary. In 1986, the term was not defined in Webster's New World Thesaurus or the New Encyclopædia Britannica. After the Soviet term became widely known in the 1980s, native speakers of English broadened it to mean "any government communication (either overt or covert) containing intentionally false and misleading material, often combined selectively with true information, which seeks to mislead and manipulate either elites or a mass audience."
By 1990, use of the term disinformation had fully established itself in the English language within the lexicon of politics. By 2001, disinformation had come to be known as simply a more civil phrase for saying someone was lying. Stanley B. Cunningham wrote in his 2002 book The Idea of Propaganda that disinformation had become pervasively used as a synonym for propaganda.
Astroturfing | A centrally coordinated campaign that mimics grassroots activism by making participants pretend to be ordinary citizens
Clickbait | The deliberate use of misleading headlines and thumbnails to increase online traffic for profit or popularity
Conspiracy theories | Rebuttals of official accounts that propose alternative explanations in which individuals or groups act in secret
Culture wars | A phenomenon in which multiple groups of people, who hold entrenched values, attempt to steer public policy contentiously
Doxing | A form of online harassment that breaches privacy boundaries by releasing information intending physical and online harm to a target
Echo chamber | An epistemic environment in which participants encounter beliefs and opinions that coincide with their own
Fake news | Genre: the deliberate creation of pseudo-journalism. Label: the instrumentalization of the term to delegitimize news media
Greenwashing | Deceptive communication that makes people believe a company is environmentally responsible when it is not
Hoax | News in which false facts are presented as legitimate
Propaganda | Organized mass communication, on a hidden agenda, and with a mission to conform belief and action by circumventing individual reasoning
Pseudoscience | Accounts that claim the explanatory power of science, borrow its language and legitimacy, but diverge substantially from its quality criteria
Rumors | Unsubstantiated news stories that circulate while not corroborated or validated
Trolling | Networked groups of digital influencers that operate 'click armies' designed to mobilize public sentiment
Urban legend | Moral tales featuring durable stories of intruders incurring boundary transgressions and their dire consequences
In 2019, Camille François devised the "ABC" framework for understanding different modalities of online disinformation: manipulative Actors, deceptive Behaviors, and harmful Content.
In 2020, the Brookings Institution proposed amending this framework to include Distribution, defined by the "technical protocols that enable, constrain, and shape user behavior in a virtual space". Similarly, the Carnegie Endowment for International Peace proposed adding Degree ("distribution of the content ... and the audiences it reaches") and Effect ("how much of a threat a given case poses").
Disinformation is primarily carried out by government intelligence agencies, but has also been used by non-governmental organizations and businesses. Front groups are a form of disinformation, as they mislead the public about their true objectives and who their controllers are. Most recently, disinformation has been deliberately spread through social media in the form of "fake news": disinformation masked as legitimate news articles and meant to mislead readers or viewers. Disinformation may include the distribution of forged documents, manuscripts, and photographs, or the spreading of dangerous and fabricated intelligence. Use of these tactics can lead to blowback, however, causing unintended consequences such as defamation lawsuits or damage to the dis-informer's reputation.
In October 1986, the term gained increased currency in the U.S. when it was revealed that two months previously, the Reagan Administration had engaged in a disinformation campaign against then-leader of Libya, Muammar Gaddafi. White House representative Larry Speakes said reports of a planned attack on Libya as first broken by The Wall Street Journal on August 25, 1986, were "authoritative", and other newspapers including The Washington Post then wrote articles saying this was factual. U.S. State Department representative Bernard Kalb resigned from his position in protest over the disinformation campaign, and said: "Faith in the word of America is the pulse beat of our democracy."
The executive branch of the Reagan administration kept watch on disinformation campaigns through three yearly publications by the Department of State: Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns (1986); Report on Active Measures and Propaganda, 1986–87 (1987); and Report on Active Measures and Propaganda, 1987–88 (1989).
According to a report by Reuters, the United States ran a propaganda campaign to spread disinformation about the Chinese Sinovac COVID-19 vaccine, including using fake social media accounts to spread the disinformation that the Sinovac vaccine contained pork-derived ingredients and was therefore haram under Sharia. Reuters said the ChinaAngVirus disinformation campaign was designed to "counter what it perceived as China's growing influence in the Philippines" and was prompted by the "fear that China's COVID diplomacy and propaganda could draw other Southeast Asian countries, such as Cambodia and Malaysia, closer to Beijing". The campaign was also described as "payback for Beijing's efforts to blame Washington for the pandemic". The campaign primarily targeted people in the Philippines and used a social media hashtag meaning "China is the virus" in Tagalog. The campaign ran from 2020 to mid-2021. The primary contractor for the U.S. military on the project was General Dynamics, which received $493 million for its role.
Disinformation research has also become politicised in the United States. Since 2023, Republican members of the US Congress have attacked researchers who study disinformation, characterizing their work as hostile to freedom of speech and as a euphemism for government censorship. On April 18, 2025, citing an executive order signed by Trump, the US National Science Foundation released a statement cancelling funding for disinformation research, stating that it does not fit with NSF priorities, "including but not limited to those on diversity, equity, and inclusion (DEI) and misinformation/disinformation."
Whereas disinformation research has focused primarily on how actors orchestrate deceptions on social media, chiefly via fake news, newer research investigates how people take what started as deceptions and circulate them as their personal views. This research shows that disinformation can be conceptualized as a program that encourages engagement in oppositional fantasies (i.e., culture wars), through which disinformation circulates as rhetorical ammunition for never-ending arguments. As disinformation entangles with culture wars, identity-driven controversies become a vehicle through which disinformation disseminates on social media. This means that disinformation thrives not despite raucous grudges but because of them: controversies provide fertile ground for never-ending debates that solidify points of view.
Scholars have pointed out that disinformation is not only a foreign threat, as domestic purveyors of disinformation also leverage traditional media outlets such as newspapers, radio stations, and television news to disseminate false information. Current research suggests that right-wing online political activism in the United States may be more likely to use disinformation as a strategy and tactic. Governments have responded with a wide range of policies to address concerns about the potential threats that disinformation poses to democracy; however, there is little agreement in elite policy discourse or academic literature as to what it means for disinformation to threaten democracy, or how different policies might help to counter its negative implications.
Research after the 2016 election found: (1) for 14 percent of Americans, social media was their "most important" source of election news; (2) known false news stories "favoring Trump were shared a total of 30 million times on Facebook, while those favoring Clinton were shared 8 million times"; (3) the average American adult saw fake news stories, "with just over half of those who recalled seeing them believing them"; and (4) people are more likely to "believe stories that favor their preferred candidate, especially if they have ideologically segregated social media networks." Correspondingly, while there is wide agreement that the digital spread and uptake of disinformation during the 2016 election was massive and very likely facilitated by foreign agents, there is an ongoing debate over whether it had any actual effect on the election. For example, a double-blind randomized control experiment by researchers from the London School of Economics (LSE) found that exposure to online fake news about either Trump or Clinton had no significant effect on intentions to vote for those candidates. Researchers who examined the influence of Russian disinformation on Twitter during the 2016 US presidential campaign found that exposure to disinformation was (1) concentrated among a tiny group of users, (2) primarily among Republicans, and (3) eclipsed by exposure to legitimate political news media and politicians. Finally, they found "no evidence of a meaningful relationship between exposure to the Russian foreign influence campaign and changes in attitudes, polarization, or voting behavior." As such, despite its mass dissemination during the 2016 presidential election, online fake news or disinformation probably did not cost Hillary Clinton the votes needed to secure the presidency.
Research on this topic remains inconclusive. For example, misinformation appears not to significantly change the political knowledge of those exposed to it. There seems to be a higher diversity of news sources to which users are exposed on Facebook and Twitter than conventional wisdom would suggest, as well as a higher frequency of cross-spectrum discussion. Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of targeted states.
Research is also challenging because disinformation is designed to be difficult to detect, and some social media companies have discouraged outside research efforts. For example, researchers found that disinformation made "existing detection algorithms from traditional news media ineffective or not applicable...because [fake news] is intentionally written to mislead readers...and users' social engagements with fake news produce data that is big, incomplete, unstructured, and noisy." Facebook, the largest social media company, has been criticized by analytical journalists and scholars for preventing outside research on disinformation.